Non-monotonic Poisson Likelihood Maximization

Authors

  • Suvrit Sra
  • Dongmin Kim
  • Bernhard Schölkopf
Abstract

This report summarizes the theory and the main applications of a new non-monotonic algorithm for maximizing a Poisson likelihood, which for positron emission tomography (PET) is equivalent to minimizing the associated Kullback-Leibler divergence, and for transmission tomography is similar to maximizing the dual of a maximum-entropy problem. We call our method non-monotonic maximum likelihood (NMML) and show its application to problems such as tomography and image restoration. We discuss theoretical properties of the algorithm, such as convergence. Our experimental results indicate that the speedups obtained via our non-monotonic methods are substantial.
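
The stated equivalence between Poisson likelihood maximization and Kullback-Leibler divergence minimization can be made explicit with a short sketch in standard emission-tomography notation (the symbols A, x, y below are illustrative and not taken from the report). With counts y_i ~ Poisson((Ax)_i), the log-likelihood is

  L(x) = \sum_i [ y_i \log (Ax)_i - (Ax)_i ] + const,

while the (generalized) Kullback-Leibler divergence is

  KL(y || Ax) = \sum_i [ y_i \log ( y_i / (Ax)_i ) - y_i + (Ax)_i ] = -L(x) + const,

so maximizing L(x) over x >= 0 is the same problem as minimizing KL(y || Ax).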

Similar articles

Monotonically Overrelaxed EM Algorithms

We explore the idea of overrelaxation for accelerating the expectation-maximization (EM) algorithm, focusing on preserving its simplicity and monotonic convergence properties. It is shown that in many cases a trivial modification in the M-step results in an algorithm that maintains monotonic increase in the log-likelihood, but can have an appreciably faster convergence rate, especially when EM ...
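
As a rough sketch of the overrelaxation idea described above (our notation, not the paper's): if M denotes the ordinary EM update map, an overrelaxed iteration takes

  \theta^{(t+1)} = \theta^{(t)} + \omega ( M(\theta^{(t)}) - \theta^{(t)} ),   \omega >= 1,

which reduces to standard EM at \omega = 1, while larger \omega can give a noticeably faster convergence rate as long as each step still increases the log-likelihood.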

Parameter estimation of Poisson generalized linear mixed models based on three different statistical principles: a simulation study

Generalized linear mixed models are flexible tools for modeling non-normal data and are useful for accommodating overdispersion in Poisson regression models with random effects. Their main difficulty resides in the parameter estimation because there is no analytic solution for the maximization of the marginal likelihood. Many methods have been proposed for this purpose and many of them are impl...
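
For concreteness, a generic Poisson generalized linear mixed model of the kind discussed above can be written as (illustrative notation, not taken from the article)

  y_{ij} | u_i ~ Poisson(\mu_{ij}),   \log \mu_{ij} = x_{ij}^T \beta + u_i,   u_i ~ N(0, \sigma^2),

where the random effects u_i must be integrated out; the resulting marginal likelihood has no closed form, which is why approximate or simulation-based estimation principles are compared.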

Maximum Conditional Likelihood via Bound Maximization and the CEM Algorithm

We present the CEM (Conditional Expectation Maximization) algorithm as an extension of the EM (Expectation Maximization) algorithm to conditional density estimation under missing data. A bounding and maximization process is given to specifically optimize conditional likelihood instead of the usual joint likelihood. We apply the method to conditioned mixture models and use bounding techniques to ...
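
The distinction at issue is between the joint and the conditional log-likelihood objectives; schematically (notation ours, not the paper's),

  joint:        \sum_n \log p(x_n, y_n; \theta)
  conditional:  \sum_n \log p(y_n | x_n; \theta) = \sum_n [ \log p(x_n, y_n; \theta) - \log p(x_n; \theta) ],

and CEM constructs bounds for the conditional objective rather than the joint one.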

Space-alternating Generalized EM Algorithms for Penalized Maximum-likelihood Image Reconstruction

Most expectation-maximization (EM) type algorithms for penalized maximum-likelihood image reconstruction converge particularly slowly when one incorporates additive background effects such as scatter, random coincidences, dark current, or cosmic radiation. In addition, regularizing smoothness penalties (or priors) introduce parameter coupling, rendering intractable the M-steps of most EM-type a...
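
The objective referred to here is, schematically (illustrative notation, not from the paper), a Poisson log-likelihood with an additive background term plus a roughness penalty:

  y_i ~ Poisson( (Ax)_i + r_i ),
  \Phi(x) = \sum_i [ y_i \log((Ax)_i + r_i) - ((Ax)_i + r_i) ] - \beta R(x),

where r_i collects the background effects (scatter, randoms, dark current) and the penalty R couples neighbouring parameters, which is what renders the exact M-step intractable.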

New Complete-Data Spaces and Faster Algorithms for Penalized-Likelihood Emission Tomography

The classical expectation-maximization (EM) algorithm for image reconstruction suffers from particularly slow convergence when additive background effects such as accidental coincidences and scatter are included. In addition, when smoothness penalties are included in the objective function, the M-step of the EM algorithm becomes intractable due to parameter coupling. This paper describes the sp...


Journal title:

Volume   Issue 

Pages  -

Publication date  2008